
    Interactive Sound Propagation using Precomputation and Statistical Approximations

    Acoustic phenomena such as early reflections, diffraction, and reverberation have been shown to improve the user experience in interactive virtual environments and video games. These effects arise due to repeated interactions between sound waves and objects in the environment. In interactive applications, these effects must be simulated within a prescribed time budget. We present two complementary approaches for computing such acoustic effects in real time, with plausible variation in the sound field throughout the scene. The first approach, Precomputed Acoustic Radiance Transfer, precomputes a matrix that accounts for multiple acoustic interactions between all scene objects. The matrix is used at run time to provide sound propagation effects that vary smoothly as sources and listeners move. The second approach couples two techniques -- Ambient Reverberance and Aural Proxies -- to provide approximate sound propagation effects in real time, based only on the portion of the environment immediately visible to the listener. These approaches lie at opposite ends of the space of techniques for modeling sound propagation effects in interactive applications: the first emphasizes accuracy by modeling acoustic interactions between all parts of the scene; the second emphasizes efficiency by taking only the local environment of the listener into account. These methods have been used to efficiently generate acoustic walkthroughs of architectural models. They have also been integrated into a modern game engine, and can enable realistic, interactive sound propagation on commodity desktop PCs.
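
    At run time, the precomputed-transfer approach reduces to applying the stored matrix to the source's direct contribution and then gathering the result at the listener. The C++ sketch below illustrates only that structure; the patch discretization, the matrix layout, and all function names are assumptions made for illustration, not the thesis's actual implementation.

```cpp
// Minimal sketch of the run-time step of a precomputed acoustic transfer
// approach, assuming an offline solver has already produced a dense
// patch-to-patch transfer matrix T (all names here are hypothetical).
#include <cstddef>
#include <vector>

// Acoustic energy (per frequency band) arriving at each surface patch
// directly from the source; filled each frame from the source position.
using PatchRadiance = std::vector<double>;

// Apply the precomputed transfer matrix: one matrix-vector product turns
// the direct (first-bounce) radiance into radiance that already accounts
// for repeated interactions between all scene objects.
PatchRadiance applyTransfer(const std::vector<PatchRadiance>& T,
                            const PatchRadiance& direct)
{
    PatchRadiance total(T.size(), 0.0);
    for (std::size_t i = 0; i < T.size(); ++i)
        for (std::size_t j = 0; j < T[i].size() && j < direct.size(); ++j)
            total[i] += T[i][j] * direct[j];
    return total;
}

// Gather every patch's contribution at the listener, weighted by a
// per-patch visibility/attenuation term computed for the current frame.
double gatherAtListener(const PatchRadiance& patchEnergy,
                        const std::vector<double>& listenerWeights)
{
    double energy = 0.0;
    for (std::size_t i = 0;
         i < patchEnergy.size() && i < listenerWeights.size(); ++i)
        energy += patchEnergy[i] * listenerWeights[i];
    return energy;
}
```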

    Source Directivity and Spatial Audio for Interactive Wave-based Sound Propagation

    Presented at the 20th International Conference on Auditory Display (ICAD2014), June 22-25, 2014, New York, NY. This paper presents an approach to model time-varying source directivity and HRTF-based spatial audio for wave-based sound propagation at interactive rates. The source directivity is expressed as a linear combination of elementary spherical harmonic sources. The propagated sound field due to each spherical harmonic source is precomputed and stored in an offline step. At runtime, the time-varying source directivity is decomposed into spherical harmonic coefficients. These coefficients are combined with the precomputed spherical harmonic sound fields to generate the propagated sound field at the listener position corresponding to the directional source. To compute spatial audio for a moving and rotating listener, an efficient plane-wave decomposition approach based on the derivatives of the sound field is presented. The source directivity and spatial audio approaches have been integrated with the Half-Life 2 game engine and the Oculus Rift head-mounted display to enable realistic acoustic effects for virtual environments and games.
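
    Because wave propagation is linear, the run-time handling of a directional source reduces to weighting the precomputed per-harmonic sound fields by the current directivity coefficients and summing. The C++ sketch below shows that combination step under an assumed data layout; the types and function names are illustrative, not the paper's actual code.

```cpp
// Minimal sketch of the run-time combination step, assuming the offline
// stage stored one propagated sound field per spherical harmonic source,
// sampled at the listener position for the current frame. Names are
// illustrative only.
#include <cstddef>
#include <vector>

// Precomputed sound field for one spherical harmonic source at the
// listener (e.g. one frequency band of an impulse response).
using Field = std::vector<float>;

// Weight each precomputed per-harmonic field by the time-varying
// spherical harmonic coefficient of the source directivity and sum:
// the result is the propagated field for the directional source.
Field combineDirectional(const std::vector<Field>& shFields,  // one per (l, m)
                         const std::vector<float>& shCoeffs)  // directivity coefficients
{
    Field result(shFields.empty() ? std::size_t{0} : shFields[0].size(), 0.0f);
    for (std::size_t k = 0; k < shFields.size() && k < shCoeffs.size(); ++k)
        for (std::size_t n = 0; n < result.size() && n < shFields[k].size(); ++n)
            result[n] += shCoeffs[k] * shFields[k][n];
    return result;
}
```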